Variational Bayesian Inference with Stochastic Search

Authors

  • John William Paisley
  • David M. Blei
  • Michael I. Jordan
Abstract

Mean-field variational inference is a method for approximate Bayesian posterior inference. It approximates a full posterior distribution with a factorized set of distributions by maximizing a lower bound on the marginal likelihood. This requires the ability to integrate a sum of terms in the log joint likelihood using this factorized distribution. Often not all integrals are in closed form, which is typically handled by using a lower bound. We present an alternative algorithm based on stochastic optimization that allows for direct optimization of the variational lower bound. This method uses control variates to reduce the variance of the stochastic search gradient, in which existing lower bounds can play an important role. We demonstrate the approach on two non-conjugate models: logistic regression and an approximation to the HDP.
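The core idea above — optimizing the variational lower bound by stochastic search, with a control variate subtracted from the score-function gradient to tame its variance — can be illustrated with a minimal sketch. This is a toy setting of my own (not the paper's logistic regression or HDP models): the variational distribution is q(z | mu) = Normal(mu, 1), the target expectation is E_q[z^2] whose true gradient in mu is 2*mu, and the score itself serves as the zero-mean control variate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy illustration (not the paper's models): estimate the gradient
#   d/dmu E_q[f(z)]  for  q(z | mu) = Normal(mu, 1),  f(z) = z**2,
# via the score-function identity
#   d/dmu E_q[f(z)] = E_q[ f(z) * d/dmu log q(z | mu) ],
# where the score is d/dmu log q(z | mu) = z - mu.
# The true gradient here is 2 * mu, so both estimators can be checked.

def grad_estimates(mu, n_samples, rng):
    z = rng.normal(mu, 1.0, size=n_samples)
    score = z - mu                 # score function; E_q[score] = 0
    plain = (z ** 2) * score       # naive stochastic-search gradient samples

    # Control variate: the score has zero mean under q, so subtracting
    # a_hat * score leaves the estimator (essentially) unbiased while
    # reducing variance. a_hat is the usual regression coefficient,
    # estimated here from the same samples for simplicity.
    a_hat = np.cov(plain, score)[0, 1] / np.var(score)
    controlled = plain - a_hat * score
    return plain, controlled

mu = 1.5
plain, controlled = grad_estimates(mu, 100_000, rng)
print("true gradient      :", 2 * mu)
print("plain estimate     :", plain.mean(), " var:", plain.var())
print("controlled estimate:", controlled.mean(), " var:", controlled.var())
```

Both estimators converge to the true gradient 2*mu = 3.0, but the controlled one does so with markedly lower sample variance; the paper's contribution is, in part, choosing good control variates (e.g. from existing analytic lower bounds) in realistic non-conjugate models.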

Similar Articles

Truncation-free Stochastic Variational Inference for Bayesian Nonparametric Models

We present a truncation-free stochastic variational inference algorithm for Bayesian nonparametric models. While traditional variational inference algorithms require truncations for the model or the variational distribution, our method adapts model complexity on the fly. We studied our method with Dirichlet process mixture models and hierarchical Dirichlet process topic models on two large data...


Stochastic Variational Inference for Bayesian Sparse Gaussian Process Regression

This paper presents a novel variational inference framework for deriving a family of Bayesian sparse Gaussian process regression (SGPR) models whose approximations are variationally optimal with respect to the full-rank GPR model enriched with various corresponding correlation structures of the observation noises. Our variational Bayesian SGPR (VBSGPR) models jointly treat both the distribution...


Variational Bayesian Inference for Partially Observed Diffusions

In this paper the variational Bayesian approximation for partially observed continuous time stochastic processes is studied. We derive an EM-like algorithm and give its implementations. The variational Expectation step is explicitly solved using the method of conditional moment generating functions and stochastic partial differential equations. The numerical experiments demonstrate that the var...


Variational Bayesian inference for partially observed stochastic dynamical systems

In this paper the variational Bayesian approximation for partially observed continuous time stochastic processes is studied. We derive an EM-like algorithm and describe its implementation. The variational Expectation step is explicitly solved using the method of conditional moment generating functions and stochastic partial differential equations. The numerical experiments demonstrate that the ...


Collapsed Variational Bayesian Inference for PCFGs

This paper presents a collapsed variational Bayesian inference algorithm for PCFGs that has the advantages of two dominant Bayesian training algorithms for PCFGs, namely variational Bayesian inference and Markov chain Monte Carlo. In three kinds of experiments, we illustrate that our algorithm achieves close performance to the Hastings sampling algorithm while using an order of magnitude less t...



Journal title:

Volume   Issue

Pages  -

Publication date: 2012